Extending the relevant component analysis algorithm for metric learning using both positive and negative equivalence constraints

Authors

  • Dit-Yan Yeung
  • Hong Chang
Abstract

Relevant component analysis (RCA) is a recently proposed metric learning method for semi-supervised learning applications. It is a simple and efficient method that has been applied successfully to give impressive results. However, RCA can make use of supervisory information in the form of positive equivalence constraints only. In this paper, we propose an extension to RCA that allows both positive and negative equivalence constraints to be incorporated. Experimental results show that the extended RCA algorithm is effective.
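Since the abstract only names the baseline, the following is a minimal sketch of standard RCA learned from positive equivalence constraints alone (points grouped into "chunklets" known to share a class): the average within-chunklet covariance is estimated and its inverse square root is applied as a whitening transform, which is equivalent to using a Mahalanobis metric with matrix C^{-1}. The function name rca_transform and the eigenvalue floor 1e-12 are illustrative choices, and the paper's extension that also incorporates negative equivalence constraints is not reproduced here, as the abstract does not specify its details.

```python
import numpy as np

def rca_transform(X, chunklets):
    """Standard RCA from positive equivalence constraints only (a sketch).

    X         : (n, d) data matrix
    chunklets : list of index arrays; points within a chunklet are linked
                by positive equivalence constraints (same, unknown class)
    Returns a whitening matrix W; distances are then ||W x - W y||.
    """
    d = X.shape[1]
    C = np.zeros((d, d))
    n_points = 0
    for idx in chunklets:
        Xc = X[idx] - X[idx].mean(axis=0)   # center each chunklet
        C += Xc.T @ Xc                       # accumulate within-chunklet scatter
        n_points += len(idx)
    C /= n_points                            # average within-chunklet covariance
    # Whitening transform W = C^{-1/2}; the 1e-12 floor (an illustrative
    # choice) guards against zero eigenvalues in degenerate directions.
    eigvals, eigvecs = np.linalg.eigh(C)
    W = eigvecs @ np.diag(1.0 / np.sqrt(np.maximum(eigvals, 1e-12))) @ eigvecs.T
    return W
```

With the learned W, the distance between two points x and y is simply the Euclidean distance between W @ x and W @ y.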

Related articles

Learning a Mahalanobis Metric from Equivalence Constraints

Many learning algorithms use a metric defined over the input space as a principal tool, and their performance critically depends on the quality of this metric. We address the problem of learning metrics using side-information in the form of equivalence constraints. Unlike labels, we demonstrate that this type of side-information can sometimes be automatically obtained without the need of human ...

Corrigendum to "Cluster-based adaptive metric classification" [Neurocomputing 81 (2012) 33-40]

The components analysis family of algorithms learns transformations from constraints as explained in the survey of Yang [34]: "Relevant Components Analysis (RCA) [10] learns a global linear transformation from (a set of) equivalence constraints. The learned transformation can be used directly to compute (a) distance between any two examples. Discriminative Component Analysis (DCA) and Kernel D...

Semi-supervised composite kernel learning using distance metric learning techniques

Distance metric has a key role in many machine learning and computer vision algorithms so that choosing an appropriate distance metric has a direct effect on the performance of such algorithms. Recently, distance metric learning using labeled data or other available supervisory information has become a very active research area in machine learning applications. Studies in this area have shown t...

Semi-Supervised Metric Learning Using Pairwise Constraints

Distance metric has an important role in many machine learning algorithms. Recently, metric learning for semi-supervised algorithms has received much attention. For semi-supervised clustering, usually a set of pairwise similarity and dissimilarity constraints is provided as supervisory information. Until now, various metric learning methods utilizing pairwise constraints have been proposed. The...

Learning a Mahalanobis Metric with Side Information (Leibniz Center for Research in Computer Science, Technical Report 2003-34)

Many learning algorithms use a metric defined over the input space as a principal tool, and their performance critically depends on the quality of this metric. We address the problem of learning metrics using side-information in the form of equivalence constraints. Unlike labels, we demonstrate that this type of side-information can sometimes be automatically obtained without the need of human ...

Journal:
  • Pattern Recognition

Volume 39, Issue -

Pages -

Publication date: 2006